S-DIGing: A Stochastic Gradient Tracking Algorithm for Distributed Optimization

Authors

Abstract

In this article, we study convex optimization problems where the agents of a network cooperatively minimize a global objective function that consists of multiple local objective functions. The intention of this work is to solve large-scale optimization problems where the local instantaneous functions are complicated and numerous. Different from most existing works, the local objective function of each agent is presented as the average of finitely many instantaneous functions. Integrating the gradient tracking algorithm with stochastic averaging gradient technology, a distributed stochastic gradient tracking algorithm (termed S-DIGing) is proposed. At each time instant, only one randomly selected gradient of an instantaneous function is computed and applied to approximate the local batch gradient for each agent. Based on a novel primal-dual interpretation of the S-DIGing algorithm, it is shown that S-DIGing linearly converges to the global optimal solution when the step-size does not exceed an explicit upper bound and the instantaneous functions are strongly convex with Lipschitz continuous gradients. Numerical experiments are presented to demonstrate the practicability of the S-DIGing algorithm and the correctness of the theoretical results.
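The abstract describes combining a gradient tracking (DIGing-style) update with a SAGA-style stochastic averaging gradient estimate. The sketch below is a minimal illustration of such an update under simplifying assumptions, not the authors' reference implementation: it assumes scalar least-squares instantaneous functions f_{i,j}(x) = 0.5 (a_{i,j}^T x - b_{i,j})^2, a fixed doubly stochastic mixing matrix W, and a constant step size alpha; all names are illustrative.

import numpy as np

# Sketch of an S-DIGing-style iteration (illustrative, not the authors' code).
# Each agent i holds q instantaneous functions; at every step one randomly
# selected instantaneous gradient is computed and combined with a stored
# gradient table to approximate the local batch gradient (SAGA-style), while a
# gradient-tracking variable y propagates the network-average estimate.
def s_diging(A, b, W, alpha, iters, rng=np.random.default_rng(0)):
    n, q, d = A.shape                                   # agents, samples per agent, dimension
    grad = lambda i, j, z: (A[i, j] @ z - b[i, j]) * A[i, j]
    x = np.zeros((n, d))                                # one local iterate per agent
    table = np.array([[grad(i, j, x[i]) for j in range(q)] for i in range(n)])
    g_old = table.mean(axis=1)                          # initial batch-gradient surrogates
    y = g_old.copy()                                    # gradient-tracking variables
    for _ in range(iters):
        x = W @ x - alpha * y                           # consensus step plus descent along the tracker
        g_new = np.empty_like(g_old)
        for i in range(n):
            j = rng.integers(q)                         # one randomly selected instantaneous function
            fresh = grad(i, j, x[i])
            g_new[i] = fresh - table[i, j] + table[i].mean(axis=0)  # stochastic averaging estimate
            table[i, j] = fresh                         # refresh the stored gradient
        y = W @ y + g_new - g_old                       # track the average gradient estimate
        g_old = g_new
    return x.mean(axis=0)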


Related Articles

A Distributed Stochastic Gradient Tracking Method

In this paper, we study the problem of distributed multi-agent optimization over a network, where each agent possesses a local cost function that is smooth and strongly convex. The global objective is to find a common solution that minimizes the average of all cost functions. Assuming agents only have access to unbiased estimates of the gradients of their local cost functions, we consider a dis...
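For context, a bare-bones stochastic gradient tracking update of the kind described above might look as follows. This is a sketch under stated assumptions, not the paper's method verbatim: quadratic local costs f_i(x) = 0.5 x^T Q_i x - c_i^T x, unbiased gradients simulated with additive Gaussian noise, and a doubly stochastic mixing matrix W; all names are illustrative.

import numpy as np

# Sketch: each agent mixes its iterate with neighbors and descends along a
# tracker y that follows the network-average stochastic gradient.
def dsgt(Q, c, W, alpha, sigma, iters, rng=np.random.default_rng(0)):
    n, d = c.shape                                   # agents, dimension
    noisy_grad = lambda i, z: Q[i] @ z - c[i] + sigma * rng.standard_normal(d)
    x = np.zeros((n, d))
    g = np.array([noisy_grad(i, x[i]) for i in range(n)])
    y = g.copy()                                     # trackers initialized at local gradients
    for _ in range(iters):
        x = W @ (x - alpha * y)                      # combined consensus and descent step
        g_new = np.array([noisy_grad(i, x[i]) for i in range(n)])
        y = W @ y + g_new - g                        # gradient tracking with stochastic gradients
        g = g_new
    return x.mean(axis=0)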

Stochastic Recursive Gradient Algorithm for Nonconvex Optimization

In this paper, we study and analyze the mini-batch version of StochAstic Recursive grAdient algoritHm (SARAH), a method employing the stochastic recursive gradient, for solving empirical loss minimization for the case of nonconvex losses. We provide a sublinear convergence rate (to stationary points) for general nonconvex functions and a linear convergence rate for gradient dominated functions,...
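The core of SARAH is a recursive stochastic gradient that reuses the previous estimate instead of a stored snapshot gradient. A minimal mini-batch sketch on a least-squares loss is given below; the loss and parameter names are illustrative, not the paper's exact setup.

import numpy as np

# Sketch of the SARAH recursive gradient: a full gradient starts each outer
# loop, then mini-batch gradient differences update the estimate recursively.
def sarah(A, b, eta, outer, inner, batch, rng=np.random.default_rng(0)):
    N, d = A.shape
    grad = lambda idx, z: A[idx].T @ (A[idx] @ z - b[idx]) / len(idx)
    w = np.zeros(d)
    for _ in range(outer):
        v = grad(np.arange(N), w)                     # full gradient at the snapshot point
        w_prev, w = w, w - eta * v
        for _ in range(inner):
            idx = rng.choice(N, size=batch, replace=False)
            v = grad(idx, w) - grad(idx, w_prev) + v  # recursive gradient estimate
            w_prev, w = w, w - eta * v
    return w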

A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions unrelated to any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...

Asynchronous Distributed Semi-Stochastic Gradient Optimization

With the recent proliferation of large-scale learning problems, there has been a lot of interest in distributed machine learning algorithms, particularly those based on stochastic gradient descent (SGD) and its variants. However, existing algorithms either suffer from slow convergence due to the inherent variance of stochastic gradients, or have a fast linear convergence rate but at t...

A Fast Distributed Stochastic Gradient Descent Algorithm for Matrix Factorization

The accuracy and effectiveness of the matrix factorization technique were well demonstrated in the Netflix movie recommendation contest. Among the numerous solutions for matrix factorization, Stochastic Gradient Descent (SGD) is one of the most widely used algorithms. However, as a sequential approach, the SGD algorithm cannot be used directly in a Distributed Cluster Environment (DCE). In this paper...
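As a reference point for the sequential baseline this paper starts from, a plain SGD matrix-factorization loop over observed ratings might look like the sketch below; hyperparameters are illustrative, and the distributed/DCE partitioning discussed in the paper is not reproduced here.

import numpy as np

# Sketch: factorize a sparse rating matrix R into user and item factors P, Q
# by SGD over the observed entries only.
def sgd_mf(ratings, n_users, n_items, rank=16, lr=0.01, reg=0.05, epochs=20,
           rng=np.random.default_rng(0)):
    P = 0.1 * rng.standard_normal((n_users, rank))   # user factors
    Q = 0.1 * rng.standard_normal((n_items, rank))   # item factors
    for _ in range(epochs):
        rng.shuffle(ratings)                         # ratings: list of (user, item, value) triples
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]                    # prediction error on one observed rating
            P[u], Q[i] = (P[u] + lr * (err * Q[i] - reg * P[u]),
                          Q[i] + lr * (err * P[u] - reg * Q[i]))
    return P, Q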

Journal

Journal title: IEEE Transactions on Emerging Topics in Computational Intelligence

Year: 2022

ISSN: 2471-285X

DOI: https://doi.org/10.1109/tetci.2020.3017242